Fast and robust tensor decomposition with applications to dictionary learning
Authors
Tselil Schramm (UC Berkeley), David Steurer (Cornell University)
Abstract
We develop fast spectral algorithms for tensor decomposition that match the robustness guarantees of the best known polynomial-time algorithms for this problem based on the sum-of-squares (SOS) semidefinite programming hierarchy. Our algorithms can decompose a 4-tensor with n-dimensional orthonormal components in the presence of error with constant spectral norm (when viewed as an n²-by-n² matrix). The running time is n⁵, which is close to linear in the input size n⁴. We also obtain algorithms with similar running time to learn sparsely-used orthogonal dictionaries even when feature representations have constant relative sparsity and non-independent coordinates. The only previous polynomial-time algorithms to solve these problems are based on solving large semidefinite programs. In contrast, our algorithms are easy to implement directly and are based on spectral projections and tensor-mode rearrangements. Our work is inspired by recent work of Hopkins, Schramm, Shi, and Steurer (STOC’16) that shows how fast spectral algorithms can achieve the guarantees of SOS for average-case problems. In this work, we introduce general techniques to capture the guarantees of SOS for worst-case problems.

UC Berkeley, [email protected]. T. S. is supported by an NSF Graduate Research Fellowship (NSF award no. 1106400). Cornell University, [email protected]. D. S. is supported by a Microsoft Research Fellowship, an Alfred P. Sloan Fellowship, NSF awards (CCF-1408673, CCF-1412958, CCF-1350196), and the Simons Collaboration on Algorithms and Geometry.
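To make the matrix view from the abstract concrete, here is a minimal NumPy sketch of the reshape-and-project idea for a clean (noiseless) orthogonal 4-tensor. It only illustrates the tensor-mode rearrangement and spectral projection primitives, not the paper's noise-robust algorithm; all variable names are our own.

```python
import numpy as np

# Illustration only: decompose a noiseless 4-tensor
# T = sum_i a_i^{tensor-4} with orthonormal components a_1, ..., a_n
# in R^n by viewing it as an n^2-by-n^2 matrix.
n = 8
rng = np.random.default_rng(0)
A = np.linalg.qr(rng.standard_normal((n, n)))[0]  # rows are orthonormal components

# Build T = sum_r a_r (x) a_r (x) a_r (x) a_r.
T = np.einsum('ri,rj,rk,rl->ijkl', A, A, A, A)

# "Tensor-mode rearrangement": group modes (1,2) and (3,4).
M = T.reshape(n * n, n * n)

# M = sum_r vec(a_r a_r^T) vec(a_r a_r^T)^T has rank n; its top-n
# eigenvectors ("spectral projection") span the vectorized projectors.
vals, vecs = np.linalg.eigh(M)
span = vecs[:, -n:]

# A random element of that span, reshaped to n-by-n, is a matrix
# sum_r g_r a_r a_r^T whose eigenvectors are the components.
g = rng.standard_normal(n)
S = (span @ g).reshape(n, n)
S = (S + S.T) / 2  # symmetrize against round-off
_, U = np.linalg.eigh(S)
recovered = U.T  # rows approximate +/- a_r, in some order

# Sanity check: every recovered row matches a true component up to sign.
corr = np.abs(recovered @ A.T)
print("max component error:", np.abs(corr.max(axis=1) - 1.0).max())
```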
Similar resources
Robust polynomial time tensor decomposition
Tensor decomposition has recently become an invaluable algorithmic primitive. It has seen much use in new algorithms with provable guarantees for fundamental statistics and machine learning problems. In these settings, some low-rank k-tensor A = Σ_{i=1}^r a_i^{⊗k}, which we would like to decompose into components a_1, …, a_r ∈ ℝ^n, is often not directly accessible. This could happen for many reasons; a...
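As a concrete reading of that expression, the following NumPy snippet (illustrative only; the names n, r, k, and components are ours) builds such a low-rank k-tensor from its components:

```python
import numpy as np

# Build a low-rank k-tensor A = sum_{i=1}^{r} a_i^{tensor-k}
# from components a_1, ..., a_r in R^n (here k = 4).
n, r, k = 6, 3, 4
rng = np.random.default_rng(1)
components = rng.standard_normal((r, n))

A = np.zeros((n,) * k)
for a in components:
    term = a
    for _ in range(k - 1):
        term = np.multiply.outer(term, a)  # raise the tensor power by one
    A += term

assert A.shape == (n,) * k
```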
Convolutional Dictionary Learning through Tensor Factorization
Tensor methods have emerged as a powerful paradigm for consistent learning of many latent variable models such as topic models, independent component analysis and dictionary learning. Model parameters are estimated via CP decomposition of the observed higher order input moments. However, in many domains, additional invariances such as shift invariances exist, enforced via models such as convolu...
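To illustrate the CP-decomposition step this abstract refers to, here is a minimal alternating-least-squares (ALS) sketch in NumPy for a 3-tensor. It is a generic toy under our own naming (unfold, khatri_rao, cp_als), not the cited paper's method:

```python
import numpy as np

def unfold(T, mode):
    # Mode-`mode` matricization of a 3-tensor.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def khatri_rao(X, Y):
    # Column-wise Kronecker product of X (m x R) and Y (p x R).
    return (X[:, None, :] * Y[None, :, :]).reshape(-1, X.shape[1])

def cp_als(T, rank, iters=200, seed=0):
    # CP decomposition T_{ijk} ~ sum_r A_{ir} B_{jr} C_{kr} via ALS:
    # solve for each factor in turn with the other two held fixed.
    rng = np.random.default_rng(seed)
    A = rng.standard_normal((T.shape[0], rank))
    B = rng.standard_normal((T.shape[1], rank))
    C = rng.standard_normal((T.shape[2], rank))
    for _ in range(iters):
        A = np.linalg.lstsq(khatri_rao(B, C), unfold(T, 0).T, rcond=None)[0].T
        B = np.linalg.lstsq(khatri_rao(A, C), unfold(T, 1).T, rcond=None)[0].T
        C = np.linalg.lstsq(khatri_rao(A, B), unfold(T, 2).T, rcond=None)[0].T
    return A, B, C

# Usage: recover the factors of a synthetic rank-3 moment-like tensor.
rng = np.random.default_rng(1)
A0, B0, C0 = (rng.standard_normal((8, 3)) for _ in range(3))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0)
A, B, C = cp_als(T, rank=3)
print(np.linalg.norm(np.einsum('ir,jr,kr->ijk', A, B, C) - T))
```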
Robust Kronecker Component Analysis
Dictionary learning and component analysis models are fundamental in learning compact representations that are relevant to a given task (feature extraction, dimensionality reduction, denoising, etc.). The model complexity is encoded by means of specific structure, such as sparsity, low-rankness, or nonnegativity. Unfortunately, approaches like K-SVD that learn dictionaries for sparse coding via...
Speech Enhancement using Adaptive Data-Based Dictionary Learning
In this paper, a speech enhancement method based on sparse representation of data frames is presented. Speech enhancement is one of the most applicable areas in different signal processing fields. The objective of a speech enhancement system is to improve either the intelligibility or the quality of speech signals. This process is carried out using speech signal processing techniques ...
Tensor-Based Dictionary Learning for Multidimensional Sparse Recovery: the K-HOSVD
In many applications of compressive sensing, the dictionary providing the sparse description is partially or entirely unknown. It has been shown that dictionary learning algorithms are able to estimate the basis vectors from a set of training samples. In some applications the dictionary is multidimensional, e.g., when jointly estimating azimuth and elevation in a 2-D direction-of-arrival (DOA) ...
Publication year: 2017